Natural language processing (NLP) is a branch of AI. It is the ability of a computer program to understand human language. It has a variety of real-world applications in fields including medical research, search engines, and business intelligence. It takes real-world input, processes it, and makes sense of it in a way a computer can understand. Like humans, computers have sensors: programs to read text and microphones to capture audio, and just like the brain, the computer uses programs to process those inputs. At some point in processing, the input is converted into code the computer can understand.
Libraries used for Sentiment Analysis
TextBlob is primarily a natural language processing library, but it comes with a rule-based sentiment analysis module.
Polarity is a float that lies in the range [-1, 1], where 1 means a positive statement and -1 means a negative statement. Subjectivity lies in the range [0, 1]: subjective sentences generally refer to personal opinion, emotion, or judgment, whereas objective sentences refer to factual information.
Subjectivity is inversely proportional to accuracy, so a lower subjectivity score should denote a more likely-to-be-accurate reading.
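A minimal sketch of reading both scores with TextBlob (the example sentence is hypothetical):

```python
from textblob import TextBlob

blob = TextBlob("The new dashboard is surprisingly easy to use!")
print(blob.sentiment.polarity)      # float in [-1, 1]
print(blob.sentiment.subjectivity)  # float in [0, 1]
```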
VADER (Valence Aware Dictionary and Sentiment Reasoner) is a lexicon and rule-based sentiment analysis tool that is specifically attuned to sentiments expressed in social media.
VADER returns scores for how positive, negative, and neutral a given input sentence is, together with a compound score that classifies the overall sentiment:
- positive sentiment: compound score >= 0.5
- neutral sentiment: -0.5 < compound score < 0.5
- negative sentiment: compound score <= -0.5

The compound score is computed by summing the valence scores of each word in the lexicon and then normalizing the result to lie between -1 (most extreme negative) and +1 (most extreme positive). The closer the compound score is to +1, the more positive the text.
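A minimal sketch of classifying a sentence with the vaderSentiment package, using the thresholds above; the example sentence is hypothetical:

```python
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()
scores = analyzer.polarity_scores("I love this product, it works great!")
# scores is a dict: {'neg': ..., 'neu': ..., 'pos': ..., 'compound': ...}

compound = scores["compound"]
if compound >= 0.5:
    label = "positive"
elif compound <= -0.5:
    label = "negative"
else:
    label = "neutral"
print(label, scores)
```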
SQLite
- SQLite truly shines because it is extremely lightweight. Setting up an SQLite database is nearly instant: there is no server to set up, no users to define, and no permissions to concern yourself with. For this reason, it is often used as a development and prototyping database, but it can be, and is, used in production.
- The main issue with SQLite is that it winds up being much like any other flat file, so high volume input/output, especially with simultaneous queries, can be problematic and slow.
- Unlike a typical flat file, which requires a full load before you can query the dataset, SQLite files don't work that way. Edits also do not require the entire file to be re-saved; only the relevant part of the file is written, which improves performance significantly.
- I have used all-caps to denote SQL-specific commands, since an SQL query contains both SQL keywords and dynamic elements that you set. Because SQL queries are strings, they can be hard to debug without some sort of differentiation like this. Note that, unlike MySQL, SQLite compares string values case-sensitively by default.
- Each value stored in an SQLite database (or manipulated by the database engine) has one of the following storage classes:
- NULL: the value is a NULL value.
- INTEGER: the value is a signed integer, stored in 1, 2, 3, 4, 6, or 8 bytes depending on the magnitude of the value.
- REAL: the value is a floating-point value, stored as an 8-byte IEEE floating-point number.
- TEXT: the value is a text string, stored using the database encoding (UTF-8, UTF-16BE, or UTF-16LE).
- BLOB: the value is a blob of data, stored exactly as it was input.
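A minimal sketch of how these storage classes map onto a table for streamed tweets; the `sentiment` table and `twitter.db` file are hypothetical names used throughout these examples:

```python
import sqlite3

# Connecting creates the database file if it does not already exist.
conn = sqlite3.connect("twitter.db")
c = conn.cursor()

# SQL keywords in caps, dynamic parts (names, values) in lower case.
c.execute("""CREATE TABLE IF NOT EXISTS sentiment (
                 unix      INTEGER,   -- tweet timestamp (INTEGER storage class)
                 tweet     TEXT,      -- tweet body (TEXT storage class)
                 sentiment REAL       -- compound score (REAL storage class)
             )""")
conn.commit()
conn.close()
```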
Tweepy
Tweepy makes it easier to use the Twitter streaming API by handling authentication, connection, creating and destroying the session, reading incoming messages, and partially routing messages. tweepy.Stream establishes a streaming session and routes messages to a StreamListener instance. The on_data method of a stream listener receives all messages and calls functions according to the message type. The default StreamListener can classify the most common Twitter messages and route them to appropriately named methods, but these methods are only stubs. OAuthHandler is used to authorize the user; it takes the consumer key and consumer secret, and the authenticated handler is then passed to the Stream together with a listener class derived from StreamListener.
- Create a class inheriting from StreamListener
- Using that class, create a Stream object
- Connect to the Twitter API using the Stream (a minimal sketch follows this list).
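A minimal sketch of those three steps, assuming the pre-4.0 Tweepy API (StreamListener, OAuthHandler) described above; the credentials and the tracked keyword are placeholders:

```python
import tweepy

# Placeholder credentials from the Twitter developer portal.
consumer_key = "CONSUMER_KEY"
consumer_secret = "CONSUMER_SECRET"
access_token = "ACCESS_TOKEN"
access_token_secret = "ACCESS_TOKEN_SECRET"

class Listener(tweepy.StreamListener):
    def on_data(self, data):
        print(data)              # raw JSON of each incoming tweet
        return True

    def on_error(self, status_code):
        if status_code == 420:   # rate limited: returning False disconnects the stream
            return False

auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)

stream = tweepy.Stream(auth, Listener())
stream.filter(track=["python"])  # placeholder keyword
```

Returning False from on_error when the status code is 420 stops Tweepy from immediately reconnecting, which matters because of the rate-limit behaviour described below.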
How to generate Twitter API and access keys:
- Go to Twitter
- Create a Developer Account; if you already have one, log in and create a new app with all the required details
- Enable OAuth to generate an access token and access secret
- Copy all the keys and use them for Twitter analysis
If clients exceed a limited number of attempts to connect to the streaming API in a window of time, they will receive error 420.
Streamlit
It is an open-source Python library that makes it easy to create and share beautiful, custom web apps for machine learning and data science.
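A minimal sketch of a Streamlit page that reads the hypothetical sentiment table from the SQLite example and plots the compound scores; run it with `streamlit run app.py`:

```python
import sqlite3

import pandas as pd
import streamlit as st

conn = sqlite3.connect("twitter.db")
keyword = st.text_input("Keyword to analyze", "python")

# Pull the most recent matching tweets and their compound scores.
df = pd.read_sql(
    "SELECT * FROM sentiment WHERE tweet LIKE ? ORDER BY unix DESC LIMIT 200",
    conn, params=("%" + keyword + "%",))

st.title("Live Twitter Sentiment")
st.line_chart(df["sentiment"])
st.dataframe(df)
```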
Implementation
- Create a table with the help of SQLite
- Pass the secret keys and stream the data
- Filter the stream by the precise keyword you want to analyze, and display its data
- From the sentiment score, we can determine whether the text is positive, negative, or neutral (a combined sketch follows this list)
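Putting the steps together, here is a minimal sketch that streams tweets matching a keyword, scores each one with VADER, and writes the results into the SQLite table; the credentials, database name, and tracked keyword are placeholders:

```python
import json
import sqlite3

import tweepy
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()
conn = sqlite3.connect("twitter.db")
c = conn.cursor()
c.execute("CREATE TABLE IF NOT EXISTS sentiment (unix INTEGER, tweet TEXT, sentiment REAL)")
conn.commit()

class Listener(tweepy.StreamListener):
    def on_data(self, data):
        tweet = json.loads(data)
        if "text" not in tweet:                  # skip delete/limit notices
            return True
        text = tweet["text"]
        compound = analyzer.polarity_scores(text)["compound"]
        c.execute("INSERT INTO sentiment (unix, tweet, sentiment) VALUES (?, ?, ?)",
                  (int(tweet["timestamp_ms"]), text, compound))
        conn.commit()
        return True

    def on_error(self, status_code):
        if status_code == 420:                   # rate limited: stop reconnecting
            return False

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
tweepy.Stream(auth, Listener()).filter(track=["python"])
```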
To retrieve the tweets of a particular user, we use Tweepy's Cursor.
Cursor
It is used to access tweets from users' timelines. It allows us to iterate over the results of a query without having to handle the pagination manually.
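A minimal sketch, again assuming the pre-4.0 Tweepy API; the credentials and screen name are placeholders:

```python
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Cursor handles paging through the user's timeline for us.
for status in tweepy.Cursor(api.user_timeline, screen_name="some_user").items(20):
    print(status.text)
```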
- While looking at the data, if a user's tweets are predominantly negative, the app directs the user to a depression-consultation form
- The data entered in the form is stored in an AWS database and from there it is sent to the depression consultant for counseling
DEVELOPERS

Harsh Viradia
